MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
Patent abstract:
The present invention relates to a mobile terminal (100) comprising a view capturing apparatus (121), a touch screen configured to display an image received from the view capturing apparatus (121), and a controller (180) configured to change a capture magnification of the view capturing apparatus (121) according to a specific type of touch applied to the touch screen, wherein the controller (180) activates a specific function associated with the view capturing apparatus (121) according to the touch applied to the touch screen in a state in which an image acquired at a preset magnification is displayed.
Publication number: FR3039674A1
Application number: FR1653244
Filing date: 2016-04-13
Publication date: 2017-02-03
Inventors: Arim Kwon; Hyungsun Kim; Cheongha Park; Sangwoon Lee; Yoomee Song; Jungmin Park; Hyerim Ku
Applicant: LG Electronics Inc.
Primary IPC class:
Patent description:
MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME The present invention relates to a mobile terminal capable of capturing an external environment by using a view capturing apparatus. Mobile terminals may include all types of devices configured to have a battery and a display unit 151, to display information on the display unit 151 using electric power supplied from the battery, and formed to allow a user to carry them by hand. The mobile terminal may include a device configured to record and play a video and a device configured to display a graphical user interface (GUI), and may include a laptop computer, a hand-held phone, glasses, a watch, a game machine, and the like capable of displaying screen information. As it becomes multifunctional, a mobile terminal may be allowed to capture still images or moving images, play music or video files, play games, receive broadcasts, and the like, so as to be implemented as an integrated media player. In addition, efforts are underway to support and increase the functionality of mobile terminals. Such efforts include software and hardware enhancements, as well as changes and improvements to the structural components. In recent years, as the functions associated with view capturing apparatuses have become more diversified owing to their improved performance, it has become burdensome for a user to perform a complicated process for setting or executing a desired function. In addition, in order to review an image or control a specific function using the image, it is disadvantageous that a process of capturing the image must be performed first. Therefore, a technical task of the present invention is to immediately perform a function associated with a view capturing apparatus by using a specific touch applied to a preview image while the preview image is displayed.
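As a concrete illustration of changing the capture magnification through a touch applied to the preview image, the following minimal sketch maps a pinch gesture to a new magnification. The zoom range, the scale convention and all names are illustrative assumptions, not details taken from the patent.

```python
# Hedged sketch: mapping a pinch gesture on the preview image to a new
# capture magnification, clamped to an assumed zoom range of the device.

MIN_ZOOM, MAX_ZOOM = 1.0, 8.0   # assumed magnification limits

def apply_pinch(current_zoom: float, pinch_scale: float) -> float:
    """pinch_scale > 1 means the fingers spread apart (zoom in);
    pinch_scale < 1 means they pinch together (zoom out)."""
    return max(MIN_ZOOM, min(MAX_ZOOM, current_zoom * pinch_scale))
```

For example, spreading the fingers to double the preview scale at 6x magnification saturates at the 8x limit rather than overshooting it.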
In order to accomplish the foregoing task of the present invention, a mobile terminal according to one embodiment may include a view capturing apparatus, a touch screen configured to display an image received from the view capturing apparatus, and a controller configured to change a capture magnification of the view capturing apparatus according to a specific type of touch applied to the touch screen, wherein the controller activates a specific function associated with the view capturing apparatus according to the touch applied to the touch screen in a state in which an image acquired at a preset magnification is displayed. Therefore, a user can perform a desired function quickly without entering a setting change mode or displaying a setting screen to set a function associated with the view capturing apparatus. According to an example associated with the present invention, when the specific type of touch is applied, the controller can separate the touch screen into a plurality of capture control regions, and the controller can control screen information displayed in the plurality of capture control regions independently according to a touch applied to the plurality of capture control regions. According to an example associated with the present invention, the mobile terminal may further include a wireless communication unit configured to perform wireless communication with an external device having a view capturing apparatus according to the specific type of touch, wherein the image and an image acquired by the view capturing apparatus of the external device are displayed, respectively, in a plurality of capture control regions of the touch screen separated by the specific type of touch. As a result, the user can perform wireless communication with an external device in a state in which an image is displayed, so as to receive an image from another device.
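The separation of the touch screen into a plurality of capture control regions described above can be sketched as follows. The simple equal-band layout, the coordinate convention and all names are assumptions made for illustration only.

```python
# Hedged sketch: dividing the touch screen into capture control regions
# and routing a touch to the region it lands in, so each region's screen
# information can be controlled independently.

def split_regions(width: int, height: int, count: int):
    """Divide the screen into `count` equal horizontal bands, each a
    candidate capture control region (x, y, w, h)."""
    band = height // count
    return [(0, i * band, width, band) for i in range(count)]

def region_for_touch(regions, x: int, y: int) -> int:
    """Index of the capture control region containing the touch point,
    or -1 if the point falls outside every region."""
    for i, (rx, ry, rw, rh) in enumerate(regions):
        if rx <= x < rx + rw and ry <= y < ry + rh:
            return i
    return -1

regions = split_regions(1080, 1920, 2)   # assumed screen size, two regions
```

A touch in the upper band would then control, for example, the terminal's own preview, while a touch in the lower band controls the image received from the external device.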
According to an example associated with the present invention, the controller can store an image displayed on the touch screen when the specific type of touch is applied while forming a video file with a plurality of images acquired through the view capturing apparatus, thus allowing the user to store the image independently without stopping the video capture. According to the present invention, a function associated with the view capturing apparatus can be carried out through a specific type of touch while an image acquired through the view capturing apparatus is displayed, without carrying out an additional process, thereby enabling the user to perform a desired function quickly without switching the view capturing apparatus back to an inactive state. In addition, an image to which a specific function is applied, or other information associated with the image, can be provided via a plurality of capture control regions of the separated touch screen, thereby allowing the user to check a result of the desired function in advance and then control a capture operation. The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated herein and form a part thereof, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention. In the drawings: Fig. 1A is a block diagram for explaining a mobile terminal associated with the present invention; Figs. 1B and 1C are views in which a mobile terminal associated with the present invention is viewed from different directions; Fig. 2A is a flowchart for explaining a method of controlling a mobile terminal according to an embodiment of the present invention; Fig. 2B is a conceptual view for explaining the control method of Fig. 2A; Figs. 3A-3C are conceptual views for explaining a method of controlling notification of the execution of a function associated with the view capturing apparatus by a touch; Figs.
4A-4C are conceptual views for explaining a control method of independently controlling capture control regions; Figs. 5A to 5C are conceptual views for explaining a control method of controlling a preview image displayed on a capture control region; Figs. 6A-6C are conceptual views for explaining a method of controlling the execution of a function associated with an external device; Figs. 7A and 7B are conceptual views for explaining a control method for controlling a change of a capture mode; Figs. 8A to 8C are conceptual views for explaining a control method of controlling the separation of the touch screen 151; Figs. 9A-9C are conceptual views for explaining a control method for controlling a front view capturing apparatus; Figs. 10A-10C are conceptual views for explaining a method of controlling the capture of an image; Figs. 11A-11C are conceptual views for explaining a method of controlling the display of a previously stored image; and Figs. 12A-12C are conceptual views for explaining a function associated with a view capturing apparatus performed by a specific control command. A description is now provided in detail according to the illustrative embodiments described herein, with reference to the accompanying drawings. For the purpose of a brief description with reference to the drawings, the same reference numbers are given to the same or equivalent components, and their description will not be repeated. A suffix "module" or "unit" used for constituent elements described in the following description is merely intended for easy description of the specification, and the suffix itself does not give any special meaning or function. In the description of the present invention, if a detailed explanation of a related known function or construction is considered to depart unnecessarily from the gist of the present invention, such an explanation has been omitted, but would be understood by those skilled in the art.
The accompanying drawings are used to assist in easily understanding the technical idea of the present invention, and it should be understood that the idea of the present invention is not limited by the accompanying drawings. The idea of the present invention is to be interpreted as extending to any modifications, equivalents and substitutes in addition to the accompanying drawings. The mobile terminals described herein may include cellular phones, smart phones, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable media players (PMPs), navigation devices, slate computers, ultra-portable computers, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)), and the like. However, those skilled in the art can readily appreciate that the configuration according to the illustrative embodiments of this specification may also be applied to stationary terminals, such as digital TVs, desktop computers and the like, excluding a case in which it is applicable only to mobile terminals. With reference to Figs. 1A-1C, Fig. 1A is a block diagram for explaining a mobile terminal associated with the present invention, and Figs. 1B and 1C are conceptual views in which an example of the mobile terminal is seen from different directions. The mobile terminal 100 may include components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, an electric power supply unit 190 and the like. Fig. 1A illustrates the mobile terminal having various components, but it should be understood that implementing all of the illustrated components is not a requirement. A greater or smaller number of components may alternatively be implemented.
In more detail, the wireless communication unit 110 among these components may typically include one or more modules that allow wireless communications between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network within which another mobile terminal 100 (or an external server) is located. For example, the wireless communication unit 110 may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-distance communication module 114, a location information module 115 and the like. The input unit 120 may include a view capturing apparatus 121 for inputting an image signal, a microphone 122 or an audio input module for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push button (or a mechanical key), etc.) to allow a user to enter information. Audio data or image data collected by the input unit 120 may be analyzed and processed as a user control command. The detection unit 140 may include at least one sensor that detects internal information of the mobile terminal, a surrounding environment of the mobile terminal and/or user information. For example, the detection unit 140 may include a proximity sensor 141, an illumination sensor 142, a touch sensor, an acceleration sensor, a magnetic sensor, a gravity sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (for example, refer to the view capturing apparatus 121), a microphone (refer to reference number 122), a battery gauge, an environmental sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, a gas sensor, etc.), and a chemical sensor (for example, an electronic nose, a health sensor, a biometric sensor, etc.).
On the other hand, the mobile terminal described herein may use information obtained by combining information detected by at least two of these sensors. The output unit 150 may be configured to output an audio signal, a video signal or a tactile signal. The output unit 150 may include a display unit 151, an audio output module 152, a haptic module 153, an optical output module 154, and the like. The display unit 151 may have an inter-layer structure or an integrated structure with a touch sensor so as to implement a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as serve as a user input unit 123 which provides an input interface between the mobile terminal 100 and the user. The interface unit 160 may serve as an interface with various types of external devices connected to the mobile terminal 100. The interface unit 160, for example, may include wired or wireless headphone ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, headset ports, and the like. The mobile terminal 100 may perform appropriate control associated with a connected external device in response to the external device being connected to the interface unit 160. The memory 170 may store a plurality of application programs (or applications) executed in the mobile terminal 100, data for operations of the mobile terminal 100, instruction words, and the like. At least some of these application programs may be downloaded from an external server via wireless communication. Some other application programs may be installed in the mobile terminal 100 at the time of shipment for basic functions of the mobile terminal 100 (for example, receiving a call, placing a call, receiving a message, sending a message, etc.).
On the other hand, the application programs may be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) of the mobile terminal 100. The controller 180 may typically control the overall operation of the mobile terminal 100 in addition to the operations associated with the application programs. The controller 180 may provide or process information or functions appropriate for a user by processing signals, data, information and the like that are input or output by the aforementioned components, or by activating the application programs stored in the memory 170. The controller 180 may control at least a portion of the components illustrated in Fig. 1A in order to run the application programs stored in the memory 170. In addition, the controller 180 may operate by combining at least two of the components included in the mobile terminal 100 in order to run the application programs. The electric power supply unit 190 may receive external electric power or internal electric power and supply the appropriate electric power required for operating the respective elements and components included in the mobile terminal 100, under the control of the controller 180. The electric power supply unit 190 may include a battery, and the battery may be an embedded battery or a replaceable battery. At least some of these elements and components may cooperate to implement the operation and control of the mobile terminal or a method of controlling the mobile terminal according to the various illustrative embodiments described herein. Also, the operation and the control or the control method of the mobile terminal may be implemented in the mobile terminal by activating at least one application program stored in the memory 170. Hereinafter, each aforementioned component will be described in more detail with reference to Fig.
1A, before explaining the various illustrative embodiments implemented by the mobile terminal 100 having the foregoing configuration. First, considering the wireless communication unit 110, the broadcast receiving module 111 of the wireless communication unit 110 may receive a broadcast signal and/or information associated with a broadcast from an external broadcast management entity via a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel. At least two broadcast receiving modules 111 may be provided in the mobile terminal 100 to simultaneously receive at least two broadcast channels or to switch between broadcast channels. The mobile communication module 112 may transmit/receive wireless signals to/from at least one network entity, for example, a base station, an external mobile terminal, a server and the like, over a mobile communication network constructed according to technical standards or transmission methods for mobile communications (for example, Global System for Mobile communication (GSM), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), High Speed Downlink Packet Access (HSDPA), Long Term Evolution (LTE), etc.). Here, the wireless signals may include an audio call signal, a video (telephony) call signal, or various formats of data according to the transmission/reception of text/multimedia messages. The wireless Internet module 113 denotes a module for wireless Internet access. This module may be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 may transmit/receive wireless signals over communication networks according to wireless Internet technologies. Examples of such wireless Internet access may include wireless local area network (WLAN), Wi-Fi Direct, Digital Living Network Alliance (DLNA), wireless broadband (WiBro), WiMAX, HSDPA, LTE, and the like.
The wireless Internet module 113 may transmit/receive data according to at least one wireless Internet technology within a range that includes even Internet technologies not mentioned above. Since wireless Internet access according to WiBro, HSDPA, GSM, CDMA, WCDMA, LTE and the like is performed via a mobile communication network, the wireless Internet module 113 which provides wireless Internet access via the mobile communication network may be understood as a type of the mobile communication module 112. The short-range communication module 114 denotes a module for short-distance communications. Suitable technologies for implementing short-range communications may include BLUETOOTH™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wi-Fi, Wi-Fi Direct, and the like. The short-range communication module 114 may support wireless communications between the mobile terminal 100 and a wireless communication system, between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless personal area networks. Here, the other mobile terminal 100 may be a portable device, for example, a smart watch, smart glasses or a head-mounted display (HMD), which is capable of exchanging data with the mobile terminal 100 (or cooperating with the mobile terminal 100). The short-distance communication module 114 may detect (recognize) a portable device capable of communicating with the mobile terminal, near the mobile terminal 100.
In addition, when the detected portable device is a device authenticated to communicate with the mobile terminal 100 according to the present invention, the controller 180 may transmit at least a portion of the data processed in the mobile terminal 100 to the portable device via the short-distance communication module 114. Thus, a user of the portable device may use the data processed in the mobile terminal 100 on the portable device. For example, when a call is received in the mobile terminal 100, the user can answer the call using the portable device. Also, when a message is received in the mobile terminal 100, the user can check the received message using the portable device. The location information module 115 denotes a module for detecting or calculating a position of the mobile terminal. An example of the location information module 115 may include a global positioning system (GPS) module or a Wi-Fi module. For example, when the mobile terminal uses the GPS module, a position of the mobile terminal may be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal may be acquired based on information related to a wireless access point (AP) which transmits or receives a wireless signal to or from the Wi-Fi module. As appropriate, the location information module 115 may perform any function of another module of the wireless communication unit 110 to obtain data for the position of the mobile terminal in a substitutive or additional manner. The location information module 115 may be a module used to acquire the position (or the current position) of the mobile terminal, and is not necessarily limited to a module for directly calculating or acquiring the position of the mobile terminal. Hereinafter, the input unit 120 will be described in more detail.
The input unit 120 may be configured to provide an audio or video signal (or information) input to the mobile terminal, or information entered by a user into the mobile terminal. For the input of image information, the mobile terminal 100 may include one or a plurality of view capturing apparatuses 121. The view capturing apparatus 121 may process image frames of still images or video obtained by image sensors in a video call mode or a capture mode. The processed image frames may be displayed on the display unit 151. On the other hand, the plurality of view capturing apparatuses 121 disposed in the mobile terminal 100 may be arranged in a matrix configuration. By using the view capturing apparatuses 121 having the matrix configuration, a plurality of pieces of image information having various angles or focal points may be input to the mobile terminal 100. Also, the plurality of view capturing apparatuses 121 may be arranged in a stereoscopic structure to acquire a left image and a right image for implementing a stereoscopic image. The microphone 122 may process an external audio signal into electrical audio data. The processed audio data may be used in various ways according to a function performed in the mobile terminal 100 (or an application program being executed). On the other hand, the microphone 122 may include various noise canceling algorithms to eliminate noise generated in the course of receiving the external audio signal. The user input unit 123 may receive information entered by a user. When information is input through the user input unit 123, the controller 180 may control an operation of the mobile terminal 100 to correspond to the information entered. The user input unit 123 may include a mechanical input element (or a mechanical key, for example, a button located on a front/rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, etc.) and a touch input means.
By way of example, the touch input means may be a virtual key, a soft key or a visual key, which is displayed on the touch screen through software processing, or a touch key which is arranged on a portion other than the touch screen. On the other hand, the virtual key or the visual key may be displayed on the touch screen in various forms, for example, a graphic, a text, an icon, a video, or a combination thereof. The detection unit 140 may detect internal information of the mobile terminal, surrounding environment information of the mobile terminal and/or user information, and generate a corresponding detection signal. The controller 180 may control the operation of the mobile terminal 100 or perform data processing, a function or an operation associated with an application program installed in the mobile terminal, according to the detection signal. Hereinafter, a description will be given in more detail of representative sensors among the various sensors that may be included in the detection unit 140. First, a proximity sensor 141 refers to a sensor for detecting the presence or absence of an object approaching a surface to be detected, or an object disposed near a surface to be detected, by using an electromagnetic field or infrared rays without a mechanical contact. The proximity sensor 141 may be arranged in an interior region of the mobile terminal covered by the touch screen, or near the touch screen. The proximity sensor 141 may have a longer lifespan and an enhanced utility compared with a contact sensor. The proximity sensor 141, for example, may include a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive type proximity sensor, a magnetic type proximity sensor, an infrared proximity sensor, and so on.
When the touch screen is implemented as a capacitive type, the proximity sensor 141 may detect the proximity of a pointer to the touch screen by changes in an electromagnetic field, which is responsive to the approach of an object with conductivity. In this case, the touch screen (touch sensor) may also be categorized as a proximity sensor. Hereinafter, for brevity of explanation, a state in which the pointer is positioned to be proximate to the touch screen without contact will be referred to as a 'proximity touch', whereas a state in which the pointer substantially comes into contact with the touch screen will be referred to as a 'contact touch'. For the position corresponding to the proximity touch of the pointer on the touch screen, such a position corresponds to a position where the pointer faces perpendicular to the touch screen upon the proximity touch of the pointer. The proximity sensor 141 may detect a proximity touch, and proximity touch profiles (for example, distance, direction, speed, time, position, moving state, etc.). On the other hand, the controller 180 may process data (or information) corresponding to the proximity touches and the proximity touch profiles detected by the proximity sensor 141, and output visual information corresponding to the processed data on the touch screen. In addition, the controller 180 may control the mobile terminal 100 to perform different operations or process different data (or information) according to whether a touch with respect to the same point on the touch screen is a proximity touch or a contact touch. A touch sensor may detect a touch (or a touch input) applied to the touch screen (or the display unit 151) using at least one of a variety of touch methods, such as a resistive type, a capacitive type, an infrared type, a magnetic field type, and the like.
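The distinction drawn above between a 'proximity touch' and a 'contact touch' on a capacitive screen can be sketched as a simple classification of the measured field disturbance. The normalized field model and the threshold value are assumptions for illustration, not values from the patent.

```python
# Hedged sketch of the proximity-touch vs contact-touch distinction:
# a capacitive proximity sensor sees a stronger electromagnetic-field
# disturbance the closer a conductive pointer comes to the screen.

CONTACT_THRESHOLD = 0.95   # assumed normalized disturbance at contact

def classify_touch(field_change: float) -> str:
    """Map a normalized field disturbance (0.0 = no pointer,
    1.0 = full contact) to a touch state."""
    if field_change >= CONTACT_THRESHOLD:
        return "contact touch"
    if field_change > 0.0:
        return "proximity touch"
    return "no touch"
```

The controller could then, as the text describes, perform a different operation for a proximity touch than for a contact touch at the same point.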
For example, the touch sensor may be configured to convert changes of pressure applied to a specific portion of the display unit 151, or of capacitance occurring from a specific portion of the display unit 151, into electrical input signals. Also, the touch sensor may be configured to detect not only a touched position and a touched area, but also touch pressure. Here, a touch object is an object used to apply a touch input to the touch sensor. Examples of the touch object may include a finger, a touch pen, a stylus, a pointer or the like. When touch inputs are detected by the touch sensors, corresponding signals may be transmitted to a touch controller. The touch controller may process the received signals and then transmit corresponding data to the controller 180. Therefore, the controller 180 may detect which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, or the controller 180 itself. On the other hand, the controller 180 may perform a different control or the same control according to the type of object that touches the touch screen (or a touch key provided in addition to the touch screen). Whether to perform a different control or the same control according to the object providing a touch input may be decided based on a current operating state of the mobile terminal 100 or an application program currently being executed. In the meantime, the touch sensor and the proximity sensor may be employed individually or in combination to detect various types of touches, such as a short touch (or tap), a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like. An ultrasonic sensor may be configured to recognize positional information relating to a sensing object using ultrasonic waves.
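A few of the touch types enumerated above (tap, long touch, drag) can be told apart from a touch's duration and travel distance. This is a hedged sketch; the thresholds and names are illustrative assumptions, not values specified in the patent.

```python
# Hedged sketch: distinguishing a short touch (tap), a long touch and a
# drag touch from the duration and travel of a single touch profile.

LONG_TOUCH_S = 0.5       # assumed: presses longer than this are long touches
DRAG_DISTANCE_PX = 30    # assumed: travel farther than this is a drag

def classify_gesture(duration_s: float, travel_px: float) -> str:
    if travel_px >= DRAG_DISTANCE_PX:
        return "drag touch"
    if duration_s >= LONG_TOUCH_S:
        return "long touch"
    return "short touch (tap)"
```

Multi-finger types such as the pinch-in and pinch-out touches would additionally compare the travel of two pointers against each other.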
The controller 180 may calculate a position of a wave generation source based on information detected by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for the light to reach the optical sensor may be much shorter than the time for the ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source may be calculated using this fact. In more detail, the position of the wave generation source may be calculated using the time difference from the time at which the ultrasonic wave reaches the sensor, based on the light as a reference signal. The view capturing apparatus 121 constituting the input unit 120 may be a type of camera sensor. The camera sensor may include at least one of a photo sensor and a laser sensor. The view capturing apparatus 121 and the laser sensor may be combined to detect a touch of a sensing object with respect to a 3D stereoscopic image. The photo sensor may be laminated on the display device. The photo sensor may be configured to scan a movement of the sensing object in proximity to the touch screen. In more detail, the photo sensor may include photo diodes and transistors in rows and columns, to scan content placed on the photo sensor using an electrical signal that changes according to the amount of light applied. Namely, the photo sensor may calculate the coordinates of the sensing object according to the variation of light, thereby obtaining position information of the sensing object. The display unit 151 may output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program running in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. The display unit 151 may also be implemented as a stereoscopic display unit for displaying stereoscopic images.
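The wave-source calculation described above treats the arrival of the light as the reference time, since light travels effectively instantaneously over device-scale distances; the ultrasound's extra travel time then directly yields the distance to the source. A minimal sketch, assuming sound travels at roughly 343 m/s in air (function names and numbers are illustrative):

```python
# Hedged sketch: estimating the distance to a wave generation source
# from the arrival-time difference between light (reference signal)
# and ultrasound at the sensors.

SPEED_OF_SOUND_M_S = 343.0   # assumed: air at roughly 20 degrees C

def source_distance(light_arrival_s: float, ultrasound_arrival_s: float) -> float:
    """Light is treated as arriving instantaneously, so the ultrasound's
    extra travel time directly gives the source distance in meters."""
    dt = ultrasound_arrival_s - light_arrival_s
    if dt < 0:
        raise ValueError("ultrasound cannot arrive before the light reference")
    return SPEED_OF_SOUND_M_S * dt

# A source about 1 m away delays the ultrasound by ~2.9 ms relative to light.
d = source_distance(light_arrival_s=0.0, ultrasound_arrival_s=0.00292)
```

With a plurality of ultrasonic sensors, the distances obtained this way can be intersected (trilateration) to yield the position, rather than only the range, of the wave generation source.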
The stereoscopic display unit may use a stereoscopic display system, such as a stereoscopic system (glasses system), an auto-stereoscopic system (glasses-free system), a projection system (holographic system), or the like. The audio output module 152 may output audio data received from the wireless communication unit 110 or stored in the memory 170 in a call signal reception mode, a call mode, a voice recognition mode, a broadcast reception mode, and the like. Also, the audio output module 152 may provide audible output signals related to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal. The audio output module 152 may include a receiver, a speaker, a buzzer, or the like. A haptic module 153 may generate various tactile effects that the user can feel. A typical example of the tactile effect generated by the haptic module 153 may be vibration. The intensity, profile and the like of the vibration generated by the haptic module 153 may be controlled by user selection or by a setting of the controller. For example, the haptic module 153 may output different vibrations in a combined or sequential manner. In addition to vibration, the haptic module 153 may generate various other tactile effects, including an effect by stimulation, such as a pin arrangement moving vertically with respect to a contact skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, an electrostatic force, etc., and an effect by reproducing the sensation of cold and warmth using an element that can absorb or generate heat. The haptic module 153 may be implemented to allow the user to feel a tactile effect through a muscle sensation, for example through the fingers or the arm of the user, as well as to transfer the tactile effect through direct contact.
Two or more haptic modules 153 may be provided depending on the configuration of the mobile terminal 100. An optical output module 154 may output a signal to indicate the generation of an event, using light from a light source. Examples of events generated in the mobile terminal 100 may include message reception, call signal reception, a missed call, an alarm, a calendar notification, e-mail reception, reception of information through an application, and the like. A signal output by the optical output module 154 may be implemented such that the mobile terminal emits monochromatic light or light with a plurality of colors. The output signal may be terminated when the mobile terminal detects that the user has checked the event. The interface unit 160 can serve as an interface with each external device connected to the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive electrical energy to be transferred to each element within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to an external device. For example, the interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. The identification module may be a chip that stores various information for authenticating the authority to use the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (hereinafter called an "identification device") can take the form of a smart card. Therefore, the identification device can be connected to the terminal 100 via the interface unit 160. 
When the mobile terminal 100 is connected to an external docking station, the interface unit 160 can serve as a passage allowing electrical power from the docking station to be supplied to the mobile terminal 100 through it, or as a passage allowing various control signals entered by the user on the docking station to be transferred to the mobile terminal through it. Various control signals or electrical energy input from the docking station can serve as signals for recognizing that the mobile terminal is properly mounted on the docking station. The memory 170 may store programs for operations of the controller 180 and temporarily store input/output data (e.g., a directory, messages, still images, videos, etc.). The memory 170 can store data related to the various vibration and audio profiles that are output in response to touch inputs on the touch screen. The memory 170 may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia micro card, a card-type memory (for example, SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk. Also, the mobile terminal 100 can be operated in association with a web storage device that performs the storage function of the memory 170 over the Internet. As mentioned above, the controller 180 can typically control the general operations of the mobile terminal 100. For example, the controller 180 can set or release a lock state to prevent a user from entering a control command with respect to applications when a state of the mobile terminal satisfies a preset condition. 
The controller 180 may also perform the control and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a drawing input made on the touch screen as characters or images, respectively. Further, the controller 180 may control one or a combination of these components in order to implement the various illustrative embodiments described herein on the mobile terminal 100. The power supply unit 190 may receive external electrical power or internal electrical energy and provide the appropriate electrical energy required to operate the respective elements and components included in the mobile terminal 100, under the control of the controller 180. The power supply unit 190 may include a battery. The battery may be an embedded battery that is rechargeable, or may be separably coupled to the terminal body for charging. The power supply unit 190 may include a connection port. The connection port may be configured as an example of the interface unit 160 to which an external (re)charger for providing electrical power to recharge the battery is electrically connected. As another example, the power supply unit 190 may be configured to recharge the battery wirelessly without using the connection port. Here, the power supply unit 190 can receive electrical energy, transferred from an external wireless electric power transmitter, using at least one of an inductive coupling method based on magnetic induction or a magnetic resonance coupling method based on electromagnetic resonance. Various embodiments described herein may be implemented in a computer-readable medium or the like using, for example, software, hardware, or any combination thereof. Referring to Figures 1B and 1C, the mobile terminal 100 described herein may be provided with a bar-type terminal body. 
However, the present invention is not limited to this, and may also be applicable to various structures, such as a watch type, a clip type, a glasses type, or a folding type, flap type, sliding type, rocking type, pivot type, or the like, in which two or more bodies are associated with each other in a relatively movable manner. Here, the terminal body can be understood as a concept designating the mobile terminal 100 as at least one assembly. The mobile terminal 100 may include a housing (case, casing, cover, etc.) forming the appearance of the terminal. In this embodiment, the housing can be divided into a front housing 101 and a rear housing 102. Various electronic components can be incorporated into a space formed between the front housing 101 and the rear housing 102. At least one middle housing can additionally be disposed between the front housing 101 and the rear housing 102. A display unit 151 may be disposed on a front surface of the terminal body to output information. As illustrated, a window 151a of the display unit 151 may be mounted on the front housing 101 to form the front surface of the terminal body together with the front housing 101. In some cases, electronic components may also be mounted on the rear housing 102. Examples of such electronic components mounted on the rear housing 102 may include a separable battery, an identification module, a memory card, and the like. Here, a rear cover 103 for covering the mounted electronic components can be releasably coupled to the rear housing 102. Thus, when the rear cover 103 is separated from the rear housing 102, the electronic components mounted on the rear housing 102 are externally exposed. As illustrated, when the rear cover 103 is coupled to the rear housing 102, a side surface of the rear housing 102 may be partially exposed. In some cases, during the coupling, the rear housing 102 may also be completely hidden by the rear cover 103. 
On the other hand, the rear cover 103 may include an opening for externally exposing an image capture apparatus 121b or an audio output module 152b. The housings 101, 102, 103 may be formed by injection molding of synthetic resin or may be formed of a metal, for example stainless steel (STS), titanium (Ti), or the like. Unlike the example in which a plurality of housings form an interior space for accommodating such diverse components, the mobile terminal 100 may be configured such that a single housing forms the interior space. In this example, a mobile terminal 100 having a uni-body, formed such that synthetic resin or metal extends from a side surface to a rear surface, can also be implemented. On the other hand, the mobile terminal 100 may include a waterproofing unit (not shown) to prevent the introduction of water into the terminal body. For example, the waterproofing unit may include a waterproofing member located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the rear cover 103, in order to seal the interior space when these housings are coupled. The mobile terminal 100 may include a display unit 151, first and second audio output modules 152a and 152b, a proximity sensor 141, an illumination sensor 142, an optical output module 154, first and second image capture apparatuses 121a and 121b, first and second handling units 123a and 123b, a microphone 122, an interface unit 160, and the like. 
Hereinafter, a description will be provided of an illustrative mobile terminal 100 in which the display unit 151, the first audio output module 152a, the proximity sensor 141, the illumination sensor 142, the optical output module 154, the first image capture apparatus 121a and the first handling unit 123a are disposed on the front surface of the terminal body, the second handling unit 123b, the microphone 122 and the interface unit 160 are disposed on a side surface of the terminal body, and the second audio output module 152b and the second image capture apparatus 121b are disposed on a rear surface of the terminal body, with reference to Fig. 1C. However, these components are not limited to this arrangement, and may be excluded or disposed on another surface if necessary. For example, the first handling unit 123a may not be disposed on the front surface of the terminal body, and the second audio output module 152b may be disposed on the side surface rather than on the rear surface of the terminal body. The display unit 151 may output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program running in the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. The display unit 151 may include at least one of a liquid crystal display (LCD) screen, a thin film transistor liquid crystal display (TFT-LCD) screen, an organic light-emitting diode (OLED) display, a flexible display screen, a three-dimensional (3D) display screen, and an electronic ink display screen. Two or more display units 151 may be provided according to the configured aspect of the mobile terminal 100. For example, a plurality of display units 151 may be arranged on one surface, either spaced apart from each other or integrated with each other, or may be arranged on different surfaces. 
The display unit 151 may include a touch sensor that detects a touch on the display unit so as to receive a control command in a touch manner. When a touch is input to the display unit 151, the touch sensor can be configured to detect that touch, and the controller 180 can generate a control command corresponding to the touch. The content input in a touch manner may be a text or numerical value, or a menu item that can be indicated or designated in various modes. The touch sensor may be configured as a film having a touch pattern, disposed between the window 151a and a display screen (not shown) on a rear surface of the window 151a, or may be a metal wire patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor can be formed integrally with the display screen. For example, the touch sensor may be disposed on a substrate of the display screen or within the display screen. The display unit 151 may form a touch screen together with the touch sensor. Here, the touch screen can serve as a user input unit 123 (see Fig. 1A). Thus, the touch screen can replace at least some of the functions of the first handling unit 123a. The first audio output module 152a may be implemented as a receiver for transferring voice sounds to the user's ear, or as a loudspeaker for outputting various alarm sounds or multimedia reproduction sounds. The window 151a of the display unit 151 may include a sound hole for outputting the sounds generated by the first audio output module 152a. However, the present invention is not limited to this. It can also be configured so that the sounds are released along an assembly gap between the structural bodies (for example, a gap between the window 151a and the front housing 101). In this case, a hole formed independently for outputting audio sounds may not be seen or may be hidden in appearance, thus further simplifying the appearance of the mobile terminal 100. 
The optical output module 154 may output light to indicate the generation of an event. Examples of events generated in the mobile terminal 100 may include message reception, call signal reception, a missed call, an alarm, a calendar notification, e-mail reception, reception of information via an application, and the like. When the user's check of an event is detected, the controller may control the optical output unit 154 to stop the light output. The first image capture apparatus 121a may process image frames, such as still or moving images, obtained by the image sensor in a video call mode or a capture mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. The first and second handling units 123a and 123b are examples of the user input unit 123, which can be manipulated by a user to input a command for controlling the operation of the mobile terminal 100. The first and second handling units 123a and 123b may also be commonly referred to as a handling portion, and may use any method as long as it is a tactile manner allowing the user to perform manipulation with a tactile sensation, such as a touch, push, scroll, or the like. The drawings are illustrated on the basis that the first handling unit 123a is a touch key, but the present invention is not necessarily limited thereto. For example, the first handling unit 123a can be configured with a mechanical key, or a combination of a touch key and a push key. The content received by the first and second handling units 123a and 123b can be set in various ways. For example, the first handling unit 123a may be used by the user to enter a command such as menu, home key, cancel, search, or the like, and the second handling unit 123b may be used by the user to input a command such as controlling a volume level output from the first or second audio output module 152a or 152b, switching into a touch recognition mode of the display unit 151, or the like. 
On the other hand, as another example of the user input unit 123, a rear input unit (not shown) may be disposed on the rear surface of the terminal body. The rear input unit may be manipulated by a user to enter a command for controlling the operation of the mobile terminal 100. The input content may be set in a variety of ways. For example, the rear input unit may be used by the user to enter a command such as power on/off, start, end, scroll or the like, control of a volume level output from the first or second audio output module 152a or 152b, switching into a touch recognition mode of the display unit 151, or the like. The rear input unit can be implemented in a form allowing a touch input, a push input, or a combination thereof. The rear input unit may be arranged to overlap the display unit 151 of the front surface in a thickness direction of the terminal body. For example, the rear input unit may be disposed on an upper end portion of the rear surface of the terminal body such that a user can easily manipulate it with an index finger when the user grips the terminal body with one hand. However, the present invention is not limited to this, and the position of the rear input unit may be changed. When the rear input unit is disposed on the rear surface of the terminal body, a new user interface can be implemented using the rear input unit. Also, the aforementioned touch screen or the rear input unit can substitute for at least a portion of the functions of the first handling unit 123a located on the front surface of the terminal body. Therefore, when the first handling unit 123a is not disposed on the front surface of the terminal body, the display unit 151 may be implemented so as to have a larger screen. On the other hand, the mobile terminal 100 may include a fingerprint scan sensor that scans a user's fingerprint. The controller may use fingerprint information detected by the fingerprint scan sensor as an authentication means. 
The fingerprint scan sensor may be installed in the display unit 151 or in the user input unit 123. The microphone 122 may be formed to receive the user's voice, other sounds, and the like. The microphone 122 may be provided at a plurality of locations and configured to receive stereo sounds. The interface unit 160 may serve as a path allowing the mobile terminal 100 to exchange data with external devices. For example, the interface unit 160 may be at least one of a connection terminal for connecting to another device (e.g., a headset, an external speaker, or the like), a port for short-range communication (e.g., an IrDA port, a Bluetooth port, a wireless LAN port, and the like), or an electrical power supply terminal for supplying electrical power to the mobile terminal 100. The interface unit 160 may be implemented as a socket for housing an external card, such as a subscriber identity module (SIM), a user identity module (UIM), or a memory card for storing information. The second image capture apparatus 121b may further be mounted on the rear surface of the terminal body. The second image capture apparatus 121b may have an image capture direction substantially opposite to the direction of the first image capture apparatus 121a. The second image capture apparatus 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such an image capture apparatus may be referred to as a "networked image capture apparatus". When the second image capture apparatus 121b is implemented as a networked image capture apparatus, images can be captured in various ways using the plurality of lenses, and images of better quality can be obtained. A flash 124 may be disposed adjacent to the second image capture apparatus 121b. When an image of a subject is captured with the image capture apparatus 121b, the flash 124 may illuminate the subject. 
The second audio output module 152b may further be disposed on the terminal body. The second audio output module 152b can implement stereophonic sound functions together with the first audio output module 152a (see Fig. 1A), and can also be used to implement a speakerphone mode for call communication. At least one antenna for wireless communication may be disposed on the terminal body. The antenna can be installed in the terminal body or formed on the housing. For example, an antenna that configures a portion of the broadcast receiving module 111 (see Fig. 1A) may be retractable into the terminal body. Alternatively, an antenna may take the form of a film attached to an inner surface of the rear cover 103, or a housing including a conductive material may serve as an antenna. A power supply unit 190 for supplying electrical power to the mobile terminal 100 may be disposed on the terminal body. The power supply unit 190 may include a battery 191 mounted in the terminal body or releasably coupled to the outside of the terminal body. The battery 191 can receive electrical energy via a power source cable connected to the interface unit 160. Also, the battery 191 can be (re)charged wirelessly using a wireless charger. The wireless charging can be implemented by magnetic induction or electromagnetic resonance. On the other hand, the drawing shows that the rear cover 103 is coupled to the rear housing 102 to cover the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from external impacts or foreign matter. When the battery 191 is separable from the terminal body, the rear cover 103 can be separably coupled to the rear housing 102. An accessory for protecting the appearance or for assisting or extending the functions of the mobile terminal 100 may also be provided on the mobile terminal 100. 
As an example of the accessory, a cover or pouch for covering or housing at least one surface of the mobile terminal 100 may be provided. The cover or pouch may cooperate with the display unit 151 to extend the functions of the mobile terminal 100. Another example of the accessory may be a touch pen for assisting or extending a touch input on a touch screen. A mobile terminal according to the present invention can perform wireless communication with a remotely controlled aircraft in order to control a function of the remotely controlled aircraft. Here, the remotely controlled aircraft may be an aircraft that does not use a runway, with a small, relatively lightweight body on which various functions, such as transporting an object, capturing an image, and performing low-altitude reconnaissance, can be carried out. A mobile terminal according to the present invention can form a control command to control the flight of the remotely controlled aircraft, and form a control command for controlling a view capturing apparatus configured to capture an external environment during the flight, among the various electronic elements mounted on the remotely controlled aircraft. Hereinafter, a method of controlling various functions of the remotely controlled aircraft using the mobile terminal will be described. Fig. 2A is a flowchart for explaining a method of controlling a mobile terminal according to an embodiment of the present invention, and Fig. 2B is a conceptual view for explaining the control method of Fig. 2A. When an application associated with the image capture apparatus 121 is executed, a capture function of the image capture apparatus 121 is activated, and an external environment is imaged by the image capture apparatus 121 (S210). The display unit 151 displays a preview image 511 of the external environment (S220). Here, a first preview image 511 corresponds to an image acquired in real time by the image capture apparatus 121. 
A current capture magnification of the image capture apparatus 121 may be displayed on the first preview image 511. The preview image 511 changes in real time according to a change in the capture distance, the external environment, and the like, of the image capture apparatus 121. The controller 180 performs a zoom-in/zoom-out function of the image capture apparatus 121 according to a specific type of touch input applied to the touch screen 151, thereby displaying a preview image having a specific enlarged or reduced region. For example, the specific type of touch may correspond to a touch input in which two touch points applied to the touch screen move toward or away from each other (pinch-in/pinch-out). Referring to Fig. 2B, a capture magnification of the image capture apparatus 121 is changed by the specific type of touch (pinch-in/pinch-out) applied to the touch screen while the first preview image 511, captured at a specific magnification of the image capture apparatus 121, is displayed thereon (S230). Therefore, the first preview image 511 is switched to a second preview image 512. The second preview image 512 is an image acquired at a smaller magnification (x 1.0) than that of the first preview image 511. The second preview image 512 may thus include an image of a wider external environment. The controller 180 controls the display unit to display the second preview image 512 when a pinch-in touch is applied to the touch screen displaying the first preview image 511, and displays the first preview image 511 again when a pinch-out touch is applied to the touch screen displaying the second preview image 512. The display unit 151 may display, together with the preview image, a capture icon 612 receiving a touch for capturing and storing a preview image, and a gallery icon 613 for displaying previously stored images. 
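A minimal sketch of the pinch-driven magnification change described above follows. The clamping bounds and the proportional mapping from finger distance to magnification are illustrative assumptions; the description itself does not specify the mapping:

```python
MIN_MAGNIFICATION = 1.0  # x1.0, the minimum magnification shown in Fig. 2B
MAX_MAGNIFICATION = 8.0  # hypothetical upper bound, for illustration only

def apply_pinch(current_magnification, start_distance, end_distance):
    """Scale the capture magnification by the ratio of the distances
    between the two touch points: fingers moving apart (pinch-out)
    zoom in, fingers moving together (pinch-in) zoom out."""
    scale = end_distance / start_distance
    new_magnification = current_magnification * scale
    # Clamp to the range the image capture apparatus supports.
    return max(MIN_MAGNIFICATION, min(MAX_MAGNIFICATION, new_magnification))
```

A pinch-in applied at x1.5 would land on the minimum (x1.0), which is exactly the state in which, per the description, further pinch-in touches start triggering the specific function instead of zooming.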
When a touch is applied to the capture icon 612, the controller 180 controls the memory 170 to store the preview image displayed at the time point at which the touch is applied. In addition, when the preview image is stored, the preview image is displayed on the gallery icon 613. Therefore, the user can know which preview image was stored by the touch of the capture icon 612. Meanwhile, after the touch of the capture icon 612, a preview image acquired by the image capture apparatus can continue to be displayed on the touch screen, or the stored image can be displayed on the touch screen. The controller 180 can continuously change the capture magnification of the image capture apparatus 121 based on the touch range over which the pinch-in and pinch-out touches are applied. A specific function associated with the image capture apparatus 121 is activated according to the touch applied in a state in which a preview image acquired at a preset capture magnification is displayed (S240). Here, the preset capture magnification corresponds to the minimum capture magnification of the image capture apparatus 121 mounted on the mobile terminal 100. The specific function associated with the image capture apparatus 121 may be set by a user. Referring to Fig. 2B, the specific function corresponds to a function of dividing the touch screen into a plurality of capture control regions. The controller 180 divides the touch screen 151 into first and second capture control regions (A1, A2) according to a pinch-in touch applied while the second preview image 512, corresponding to the minimum capture magnification of the image capture apparatus 121, is displayed on the touch screen 151. A first thumbnail image 512a of the second preview image 512 acquired by the image capture apparatus 121 is displayed in each of the first and second capture control regions (A1, A2). 
In other words, the first and second capture control regions (A1, A2) display the same preview image acquired by the image capture apparatus 121. The first and second capture control regions (A1, A2) may be formed with substantially the same area, but are not necessarily limited to this. For example, the sizes of the two capture control regions may vary depending on the region to which the specific type of touch is applied, the range of the specific type of touch, and the like. When the specific type of touch (pinch-in touch) is reapplied in a state in which the first thumbnail image 512a is displayed in each of the first and second capture control regions (A1, A2), the controller 180 divides the touch screen 151 into third to sixth capture control regions (B1, B2, B3, B4). The touch screen 151 displays a second thumbnail image 512b of the second preview image in each of the third to sixth capture control regions (B1, B2, B3, B4). When the second preview image acquired by the image capture apparatus 121 changes, the controller 180 controls the touch screen 151 so that the second thumbnail images 512b are each changed in the same manner. The third to sixth capture control regions (B1, B2, B3, B4) may be divided into substantially the same size, but are not necessarily limited to this. The controller 180 independently controls only the preview image displayed in the capture control region to which a control command is applied, for each of the divided capture control regions. Figs. 3A to 3C are conceptual views for explaining a method of notifying the execution of a function associated with the image capture apparatus 121 as a function of a touch. According to the present invention, the specific type of touch for performing a function associated with the image capture apparatus 121 is substantially the same as the specific type of touch for reducing the magnification of the image capture apparatus 121. 
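The division of the touch screen into equally sized capture control regions can be sketched with a hypothetical helper. The assumption that the two-region layout is a top/bottom split and the four-region layout a 2x2 grid is made for illustration and is consistent with, but not dictated by, the figures:

```python
def divide_screen(width, height, count):
    """Return (x, y, w, h) rectangles for the capture control regions."""
    if count == 2:  # first and second regions (A1, A2), stacked
        half = height // 2
        return [(0, 0, width, half), (0, half, width, half)]
    if count == 4:  # third to sixth regions (B1..B4), as a 2x2 grid
        w, h = width // 2, height // 2
        return [(0, 0, w, h), (w, 0, w, h), (0, h, w, h), (w, h, w, h)]
    raise ValueError("unsupported number of capture control regions")
```

Each returned rectangle would then display the same thumbnail of the current preview image, while touches are dispatched per rectangle so that each region can be controlled independently.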
However, according to the present embodiment, the execution of a function associated with the image capture apparatus 121 is notified to the user upon the same specific type of touch. Referring to Fig. 3A, the touch screen displays the first preview image 511. The controller 180 decreases the capture magnification of the image capture apparatus 121 based on the specific type of touch applied to the first preview image 511, and displays the resulting second preview image 512. The controller 180 adjusts the capture magnification of the image capture apparatus 121 according to the range of the touch. As the range of the touch increases, the change in capture magnification may increase. When the touches continue to be detected after the capture magnification has been switched to the minimum capture magnification, the controller 180 controls the touch screen 151 to display a first indicator 601 indicating the execution of a specific function associated with the image capture apparatus 121. The first indicator 601 may be formed along an edge of the touch screen 151 and displayed so as to surround the second preview image 512. When the specific type of touch is additionally or continuously applied in a state in which the first indicator 601 is displayed, the controller 180 divides the touch screen 151 into a preset number of capture control regions (B1, B2, B3, B4). Substantially the same second thumbnail image 512b is displayed in each of the third to sixth capture control regions (B1, B2, B3, B4). In other words, when a touch is additionally applied, the first indicator 601 lets the user know that a preset function of the image capture apparatus 121 is to be performed. Therefore, the user can apply a touch either to divide the touch screen into capture control regions or to adjust the capture magnification. Referring to Fig. 3B, the touch screen 151 displays the first preview image 511. 
The controller 180 decreases the capture magnification of the image capture apparatus 121 according to the specific type of touch applied to the first preview image 511. The second preview image 512 is displayed according to the change in the capture magnification of the image capture apparatus 121. The second preview image 512 corresponds to an image acquired at the minimum capture magnification. Here, the specific type of touch corresponds to a pinch-in input. When the specific type of touch is applied in a state in which the second preview image 512 is displayed, the controller 180 divides the touch screen 151 into the third to sixth capture control regions (B1, B2, B3, B4) after a preset period (t). Here, the preset period (t) may correspond to several seconds. In other words, when a control command (pinch-in input) for reducing the capture magnification is applied after the image capture apparatus 121 has been set to the minimum capture magnification, the controller 180 suspends the operation for the preset period (t) to allow the user to distinguish between reducing the capture magnification and dividing the touch screen. When another control command (e.g., cancel, pinch-out touch, etc.) is applied within the preset period (t), the controller 180 does not divide the touch screen 151. Meanwhile, the controller 180 may output a vibration before the preset period (t) elapses, to give notice of the division of the touch screen 151. The second thumbnail images 512b are displayed in the third to sixth capture control regions (B1, B2, B3, B4), respectively. Referring to Fig. 3C, when the specific type of touch (pinch-in input) is applied in a state in which the second preview image 512 is displayed, the controller 180 controls the touch screen 151 to display a notification window 611. 
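The grace-period behaviour described above, where a pinch-in at the minimum magnification only divides the screen if no cancelling command arrives within the preset period (t), can be modelled with a small, hypothetical state holder. The class and method names, and the default period of two seconds, are illustrative assumptions:

```python
class ScreenDivisionArbiter:
    """Arms a screen division when a pinch-in arrives at the minimum
    magnification; the division fires only once the preset period (t)
    elapses without a cancelling command (e.g., a pinch-out)."""

    def __init__(self, preset_period_s=2.0, min_magnification=1.0):
        self.preset_period_s = preset_period_s
        self.min_magnification = min_magnification
        self.pending_since = None   # time at which the grace period started
        self.screen_divided = False

    def on_pinch_in(self, magnification, now):
        # A pinch-in while already at the minimum starts the grace period.
        if magnification <= self.min_magnification:
            self.pending_since = now

    def on_cancel(self):
        # A pinch-out or cancel within (t) aborts the division.
        self.pending_since = None

    def tick(self, now):
        # Called periodically; fires the division once (t) has elapsed.
        if (self.pending_since is not None
                and now - self.pending_since >= self.preset_period_s):
            self.screen_divided = True
            self.pending_since = None
```

This separation of "arm" and "fire" mirrors the description: the user keeps a window of several seconds in which the pending division can still be distinguished from, and cancelled back into, a plain magnification change.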
The notification window 611 may include verification information for verifying whether or not a mode is to be performed, a graphical image, and the like. The controller 180 can separate the touch screen 151 based on a touch applied to the notification window 611 to display a reduced preview image in each capture control region. Further, the controller 180 may perform a function using the relevant second thumbnail image 512b based on a touch applied to the third to sixth capture control regions (B1, B2, B3, B4). On the other hand, the execution of the function can be cancelled via the notification window 611. When the user does not wish the function to be executed, he or she can cancel its execution via the notification window 611. In this case, the touch screen 151 continuously displays the second preview image 512. According to the present embodiment, when reducing the capture magnification and executing a specific function are triggered by the same touch mode, the user can distinguish between the two functions, thus improving user convenience. Figs. 4A to 4C are conceptual views for explaining a control method of controlling the capture control regions independently. Referring to Fig. 4A, the touch screen 151 displays the second thumbnail image 512b in the third to sixth capture control regions (B1, B2, B3, B4), respectively. The third to sixth capture control regions (B1, B2, B3, B4) each receive a touch. When a touch is applied to the third capture control region (B1), the controller 180 captures and stores the second thumbnail image 512b displayed in the third capture control region (B1). The controller 180 displays a gallery icon 613 corresponding to the second thumbnail image 512b, and continuously displays the second thumbnail image 512b in the third capture control region (B1).
While the second thumbnail image 512b is continuously displayed in the third capture control region (B1), another preview image 513 currently detected by the image capture apparatus 121 is displayed in the fourth to sixth capture control regions (B2, B3, B4). In addition, a graphical image for performing a function (e.g., deletion, sharing, etc.) on the stored second thumbnail image 512b can be displayed in the third capture control region (B1). The controller 180 may apply a control command to the fourth to sixth capture control regions (B2, B3, B4) independently of the third capture control region (B1). Although not shown in the drawing, when a touch is applied to the fourth capture control region (B2), the other preview image 513 can be captured and stored. In other words, the user can apply a control command to a divided region to perform a capture function at a different time, and perform another capture while continuously viewing previously captured images. Referring to Fig. 4B, the touch screen 151 displays the second thumbnail image 512b in the third to sixth capture control regions (B1, B2, B3, B4), respectively. The third to sixth capture control regions (B1, B2, B3, B4) each receive a touch. The controller 180 performs a video capture function for storing images acquired by the image capture apparatus 121 over time, according to a specific type of touch applied to the third capture control region (B1). Here, the specific type of touch input can correspond to a long touch input applied for a specific period. The controller 180 controls the image capture apparatus 121 and the memory 170 to capture video from the time at which the specific type of touch input is received. A capture bar 602 is displayed on the third capture control region (B1). The capture bar 602 indicates the capture time during which a video made up of the images is captured.
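The independent-region behaviour described above (tapping one region freezes and stores its frame while the other regions keep showing the live preview) can be sketched as below. The region model and names are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch of independent capture control regions: tapping a region
# captures and keeps displaying its current frame; untapped regions keep
# showing the live preview. The region indexing is an assumption.
class SplitPreview:
    def __init__(self, n_regions=4):
        self.n = n_regions
        self.frozen = {}   # region index -> captured (stored) frame
        self.gallery = []  # images shown on the gallery icon

    def on_tap(self, region, frame):
        self.frozen[region] = frame   # capture and continue displaying it
        self.gallery.append(frame)    # stored image added to the gallery

    def view(self, live_frame):
        # What each region displays: its frozen frame if captured, else live.
        return {i: self.frozen.get(i, live_frame) for i in range(self.n)}

p = SplitPreview()
p.on_tap(0, "512b")       # capture in the third capture control region (B1)
screen = p.view("513")    # the other regions now show preview image 513
```

Here region 0 stands in for B1: after the tap it keeps displaying "512b" while regions 1 to 3 show the live "513" frame, matching the paragraph.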
The controller 180 may capture video for a preset period of time depending on the specific type of touch. While the video capture is performed, the third to sixth capture control regions (B1, B2, B3, B4) display substantially the same preview image. Alternatively, when the specific type of touch is applied to another capture control region other than the third capture control region (B1), for example the fourth capture control region (B2), the controller 180 terminates the video capture in the third capture control region (B1) and performs a video capture in the fourth capture control region (B2). In this case, the third capture control region (B1) may display a preview image 513 corresponding to the captured video file, and display a graphical image for controlling the video file. In addition, the touch screen 151 is controlled to display the gallery icon 613 including the preview image 513. The controller 180 controls the touch screen 151 to display the capture bar 602 in the fourth capture control region (B2). While video capture is performed through the fourth capture control region (B2), a preview image 514 displayed in the fourth to sixth capture control regions (B2, B3, B4) is substantially the same. Although not shown in the drawing, the controller 180 may terminate the capture based on a touch applied to the fourth capture control region (B2). According to the present embodiment, the user can form a plurality of video files, each starting from a desired instant, and the capture time of one video file can overlap that of another. As a result, the user can capture a plurality of videos with various start and end points at the same time. A video capture control method according to various embodiments will be described with reference to FIG. 4C.
When a specific type of touch (a pinch-in touch input) is applied in a state in which the second preview image 512 is displayed, the controller 180 divides the touch screen 151 into the third to sixth capture control regions (B1, B2, B3, B4) to display the second thumbnail image 512b in each capture control region. When a touch is applied to the capture icon 612 in a state in which the second thumbnail image 512b is displayed in the third to sixth capture control regions (B1, B2, B3, B4), the controller 180 performs a video capture. The controller 180 controls the image capture apparatus 121 and the memory 170 to capture videos with different start and end points for the preview images displayed in the third to sixth capture control regions (B1, B2, B3, B4). The capture time can be set by the user, or determined by an additionally applied control command. First to fourth capture bars 602a, 602b, 602c, 602d indicating different lapses of time are displayed on the third to sixth capture control regions (B1, B2, B3, B4). A video capture using an image displayed in the third capture control region (B1) is performed first, and completed first. The memory 170 can store four video files captured at different times. Part of the plurality of images constituting one video file may overlap that of another. When the capture is complete in each capture control region, a preview image acquired by the image capture apparatus 121 is no longer displayed. When the capture is complete, a representative image of the video file can be displayed on the gallery icon 613. According to the present embodiment, a plurality of video files captured at different times can be formed according to a touch applied to the capture icon. Alternatively, the controller 180 controls the image capture apparatus 121 and the memory 170 to consecutively capture images at different times according to a touch applied to the capture icon 612.
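The overlapping video captures described above (several clips recorded from one frame stream, each with its own start and end point) can be modelled as below. The class name and the integer time model are assumptions for illustration; a real implementation would record actual camera frames.

```python
# Hedged sketch of overlapping video captures: each capture control region
# records its own (start, end) window over a shared stream of frames, so
# two stored clips may share part of the same footage.
class MultiRecorder:
    def __init__(self):
        self.active = {}   # region -> start time of the ongoing capture
        self.files = []    # finished clips as (region, start, end)

    def start(self, region, t):
        self.active[region] = t

    def stop(self, region, t):
        self.files.append((region, self.active.pop(region), t))

rec = MultiRecorder()
rec.start("B1", 0)
rec.start("B2", 3)   # B2 starts while B1 is still recording
rec.stop("B1", 5)    # the B1 clip is completed first
rec.stop("B2", 9)
# Time span common to both clips (their footage overlaps here):
overlap = max(0, min(5, 9) - max(0, 3))
```

The two stored clips cover times 0-5 and 3-9, so two time units of footage are shared, which is exactly the "capture time of one video file can overlap that of another" situation.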
Therefore, the user can receive images captured consecutively through each capture control region. Figs. 5A to 5C are conceptual views for explaining a control method of controlling a preview image displayed in a capture control region. Referring to Fig. 5A, when a specific type of touch (a pinch-in touch input) is applied to the second preview image 512, the controller 180 separates the touch screen 151 into the third to sixth capture control regions (B1, B2, B3, B4). The third to sixth capture control regions (B1, B2, B3, B4) respectively display the second thumbnail image 512b of the second preview image 512. When the specific type of touch (a pinch-in touch input) is applied in a state in which the third to sixth capture control regions (B1, B2, B3, B4) are separated, the controller 180 controls the touch screen 151 to display a view selection window 603. The touch screen 151 displays the second preview image 512 together with the view selection window 603 based on the specific type of touch. The view selection window 603 may include a plurality of graphic images indicating shapes into which the touch screen 151 can be separated. The controller 180 separates the touch screen according to a touch applied to one of the graphic images. For example, the touch screen can be separated into seventh and eighth capture control regions (C1, C2). The seventh and eighth capture control regions (C1, C2) may respectively display a third thumbnail image 512c of the second preview image 512. In other words, the user can separate the touch screen 151 into various shapes via an additional touch input. Figure 5B illustrates a method of controlling a preview image displayed in a capture control region. The controller 180 can separate the touch screen 151 into the third to sixth capture control regions (B1, B2, B3, B4) according to a specific type of touch (a pinch-in touch input) applied to the second preview image 512.
The third to sixth capture control regions (B1, B2, B3, B4) respectively display a second thumbnail image 512b of the second preview image 512. The controller 180 performs a function of modifying a preview image displayed in each capture control region according to the specific type of touch. For example, the controller 180 selects one of the plurality of capture control regions based on a touch applied to the touch screen 151. For example, when a touch is applied to the fourth capture control region (B2), the touch screen 151 displays a modification selection window 604. The modification selection window 604 may include a plurality of graphic images each receiving a touch to modify the preview image or to apply a visual effect to it. The touch screen 151 highlights the selected capture control region while the modification selection window 604 is displayed. As shown in FIG. 5B, the fourth capture control region (B2) can be displayed in a highlighted manner. When a touch is applied to one of the plurality of graphic images, the controller 180 applies the visual effect corresponding to the selected graphic image to the second thumbnail image 512b of the fourth capture control region (B2). The fourth capture control region (B2) displays a modification image 512b' of the second thumbnail image. The controller 180 stores the modification image 512b', and controls the touch screen 151 to display the gallery icon 613 indicating the modification image 512b'. While the modification image 512b' is displayed in the fourth capture control region (B2), the third, fifth and sixth capture control regions (B1, B3, B4) display a preview image 513 acquired by the image capture apparatus 121. The controller 180 controls the touch screen 151 to capture a preview image based on a touch additionally applied to the third, fifth, and sixth capture control regions (B1, B3, B4), or captures the preview image in a modified form.
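The per-region visual effect selection described above can be sketched as a mapping from a region index to a chosen effect. The effect functions here are string placeholders standing in for real image filters; the names `EFFECTS` and `render_regions` are illustrative assumptions, not part of the patent.

```python
# Illustrative sketch of applying a visual effect selected via the
# modification selection window 604 to one region while the other regions
# keep the unmodified live preview. Effects are placeholder functions.
EFFECTS = {
    "mono": lambda img: f"mono({img})",
    "sepia": lambda img: f"sepia({img})",
}

def render_regions(frame, region_effects, n=4):
    # region_effects maps a region index to the effect chosen for it;
    # regions with no entry display the plain preview frame.
    out = {}
    for i in range(n):
        fx = region_effects.get(i)
        out[i] = EFFECTS[fx](frame) if fx else frame
    return out

# Effect selected only for region 1 (standing in for B2); others stay live.
regions = render_regions("512b", {1: "mono"})
```

Capturing the selected region would then store `regions[1]`, the modification image, while the remaining regions continue showing the live preview.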
According to the present embodiment, the user can capture an image displayed in each capture control region while applying a visual effect to the image to store it, and can receive a preview image acquired in real time together with the image on which the visual effect is applied. A method of revising a selected preview image will be described with reference to FIG. 5C. The touch screen 151 may be separated into the third to sixth capture control regions (B1, B2, B3, B4), and the controller 180 may control the image capture apparatus 121 and the memory 170 to store an image or form a video file based on a touch applied to the fourth capture control region (B2). While the second thumbnail image 512b stored in the fourth capture control region (B2) is displayed, the third, fifth, and sixth capture control regions (B1, B3, B4) display the preview image 513 acquired by the image capture apparatus 121. The controller 180 controls the touch screen 151 to display a revision screen 521 based on a touch applied to the fourth capture control region (B2). The revision screen 521 may include the second thumbnail image 512b, a toolbar 521a, and a region display portion 521b. The toolbar 521a may include a plurality of revision icons for revising the second thumbnail image 512b. The region display portion 521b displays the divided form of the third, fifth, and sixth capture control regions (B1, B3, B4) and the region of the fourth capture control region (B2) in which the second thumbnail image 512b is displayed. In the present embodiment, a previously stored image may be revised, and the user may know, on the revision screen, the location at which the selected image is displayed on the divided touch screen. Figs. 6A-6C are conceptual views for explaining a method of controlling the execution of a function associated with an external device. Referring to FIG.
6A, the controller 180 controls the wireless communication unit to perform wireless communication with an external device 100' according to a specific type of touch applied in a state in which the second preview image 512 is displayed. Here, the external device 100' may correspond to a mobile terminal equipped with an image capture apparatus, but is not necessarily limited thereto. The external device 100' can be specified by a user setting, or set by the user's selection from among external devices 100' found by a search performed around the mobile terminal 100. When wirelessly connected to the external device 100', the controller 180 separates the touch screen 151 into the first and second capture control regions (A1, A2). The touch screen 151 displays the revision screen 521 in the first capture control region (A1). The touch screen 151 may display a region of the revision screen 521 or change the size of the revision screen 521 to display it in the first capture control region (A1). The controller 180 controls the touch screen 151 to receive a third preview image 531 acquired by the image capture apparatus of the external device 100', and display the third preview image 531 in the second capture control region (A2). The controller 180 forms a capture control command based on a touch applied to the second capture control region (A2). The wireless communication unit transmits the capture control command to the external device 100', and receives an image captured by the image capture apparatus of the external device 100'. The controller 180 stores the image in the memory 170, and controls the touch screen 151 to display the stored image on the gallery icon 613. While the image is being stored, the image capture apparatus 121 and the image capture apparatus of the external device 100' can continuously form images of the external environment.
The first and second capture control regions (A1, A2) display new preview images 522, 523 received in real time. Although not shown in the drawing, the controller 180 can store the second preview image 521 via a touch applied to the first capture control region (A1). Further, the controller 180 may control the third preview image 531 to be stored in the memory of the external device 100' according to a touch applied to the second capture control region (A2). According to the present embodiment, the user can simultaneously receive an image acquired by the image capture apparatus of the external device 100', and store an image captured by the image capture apparatus of the external device 100' in the mobile terminal. Although the mobile terminal is wirelessly connected to one external device 100' in Figure 6A, the present invention is not necessarily limited thereto. For example, when the specific type of touch (a pinch-in touch input) is applied to the first and second capture control regions (A1, A2), the controller 180 can separate the touch screen 151 into three or more capture control regions and perform wireless communication with two or more external devices. Referring to Fig. 6B, the first and second capture control regions (A1, A2) on the display unit display the second preview image 521 acquired by the image capture apparatus 121 of the mobile terminal 100 and the third preview image 531 acquired by the image capture apparatus of the external device 100', respectively. When the external device 100' is a mobile terminal, the display unit of the external device 100' is divided into first and second regions to display the third preview image 531 and the second preview image 521, respectively. The controller 180 forms a control command to control the image capture apparatus 121 and the image capture apparatus of the external device 100' according to a touch applied to the capture icon 612.
The controller 180 displays the second and third preview images 521, 531 captured by the image capture apparatus 121 and the image capture apparatus of the external device 100' in the first and second capture control regions (A1, A2). On the other hand, the display unit of the external device 100' can also continuously display the second and third preview images 521, 531. In other words, the external device 100' can be controlled according to the capture control command. According to the present embodiment, the user can simultaneously capture and store images currently displayed on the touch screen 151. Figure 6C is a conceptual view for explaining a method of controlling the association of images using an external device. The touch screen 151 displays a fourth preview image 515, and receives a specific type of touch. Here, the specific type of touch corresponds to a pinch-in touch input, and the fourth preview image 515 corresponds to an image captured at the minimum capture magnification by the image capture apparatus. The controller 180 performs wireless communication with the external device 100' according to the specific type of touch applied to the fourth preview image 515. The controller 180 separates the touch screen into seventh and eighth capture control regions (C1, C2) according to the specific type of touch, displays the fourth preview image 515 acquired by the image capture apparatus 121 in the seventh capture control region (C1), and displays a fifth preview image 532 captured by the image capture apparatus of the external device 100' in the eighth capture control region (C2). The fourth and fifth preview images 515, 532 may include different screen information. The controller 180 captures the fourth and fifth preview images 515, 532 based on a touch applied to the capture icon 612, and associates the fourth and fifth preview images 515, 532 to form a panorama image 516.
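The association of the two captured previews into one panorama image can be sketched very simply. Real panorama formation would align overlapping pixels between the two images; here each image is modelled as a list of columns and the shared columns are merged, which is a deliberate simplification introduced for illustration only.

```python
# Simplified sketch of associating two captured previews into a panorama:
# the images are lists of columns, and the columns duplicated in the
# right-hand image are dropped before joining. The overlap amount would
# come from image alignment in a real implementation (assumed here).
def stitch(left, right, overlap):
    return left + right[overlap:]

fourth_preview = ["c0", "c1", "c2", "c3"]  # from the terminal's apparatus 121
fifth_preview = ["c2", "c3", "c4", "c5"]   # from the external device 100'
panorama = stitch(fourth_preview, fifth_preview, overlap=2)
```

The two previews share columns c2 and c3, so the associated panorama covers c0 through c5, analogous to the combined field of view the paragraph describes.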
Although not shown in detail in the drawing, the controller 180 can capture consecutive images according to the motion of the mobile terminal 100 and the external device 100' after a touch is applied to the capture icon 612. According to the present embodiment, a plurality of images captured by different devices may be associated to form a panorama image. Figs. 7A and 7B are conceptual views for explaining a control method for controlling a change in a capture mode. Referring to Fig. 7A, a mode selection screen 614 for changing a capture mode is displayed according to a specific type of touch (a pinch-in touch input) applied in a state in which the fourth preview image 515 is displayed. The mode selection screen 614 may include a plurality of graphic images corresponding to a plurality of modes. For example, the touch screen can be separated into a plurality of regions depending on the specific type of touch to display the graphic images. The controller 180 may select a capture mode based on a touch applied to any one of the plurality of regions. For example, when a panorama capture mode is selected, the touch screen displays a preview image captured by the image capture apparatus 121, and displays an icon 605 associated with the panorama capture mode. When the capture mode is selected, the controller 180 controls the memory 170 to store the captured preview image. In other words, the image capture apparatus 121 is controlled to capture an image in the capture mode selected according to a touch applied to one of the plurality of separated regions. Although not shown in the drawing, when a specific touch is additionally applied to the mode selection screen 614, the controller 180 separates the touch screen into a larger number of regions. Additional capture modes may correspond to the additionally separated regions. In addition, the touch screen may display, in the capture control region corresponding to each capture mode, a modification image in which that mode's effect is applied to the preview image.
According to the present embodiment, a capture operation can be performed in a desired capture mode more quickly, without any additional procedure for selecting the capture mode. Referring to FIG. 7B, the controller 180 separates the touch screen 151 into a plurality of regions according to a specific type of touch (a pinch-in touch input) applied in a state in which the touch screen displays the second preview image 512. The second preview image 512 corresponds to an image acquired at the minimum capture magnification of the image capture apparatus 121. The touch screen 151 displays the second preview image 512 in each of the plurality of regions. Different visual effects are applied to the second preview image 512 displayed in the different regions. The controller 180 controls the touch screen 151 to apply different visual effects to images acquired by the image capture apparatus 121 and display them. The touch screen 151 may apply a plurality of visual effects to images recognized by the image capture apparatus 121 to display them. The controller 180 controls the memory 170 to store a second preview image 512c to which a visual effect is applied, according to a touch applied to the plurality of regions. The touch screen 151 displays, over the entire screen, the second preview image 512c to which the visual effect selected according to the touch is applied. Alternatively, the controller 180 controls the memory 170 to store the second preview image 512c to which the visual effect is applied according to a touch applied to one of the plurality of regions. In addition, the controller 180 controls the touch screen 151 to display the gallery icon 613 indicating the second preview image 512c to which the visual effect is applied. In this case, the touch screen re-displays a preview image acquired by the image capture apparatus 121.
In addition, although not shown in the drawing, when a touch is applied to one of the plurality of regions, the touch screen stores a preview image to which the selected visual effect is applied and displays the stored preview image. In this case, the touch screen may apply the other visual effects to a preview image recognized by the image capture apparatus 121 in the other regions to display it. Figs. 8A and 8B are conceptual views for explaining a control method of controlling separation of the touch screen 151. Referring to Fig. 8A, the touch screen 151 displays a sixth preview image 516 in the first and second capture control regions (A1, A2), respectively. The sixth preview image 516 corresponds to an image captured at the minimum capture magnification of the image capture apparatus 121. When a specific type of touch (a pinch-in touch input) is applied to the first capture control region (A1), the controller 180 divides the first capture control region (A1) into a plurality of regions. A thumbnail image 516a of the sixth preview image 516 is displayed in each of the plurality of divided regions. The controller 180 may capture a displayed preview image according to a touch applied to the plurality of regions. On the other hand, the controller 180 separates the first capture control region (A1) into a plurality of regions depending on the specific type of touch applied to the first capture control region (A1). The plurality of regions display first and second modification images 516b, 516c in which different visual effects are applied to the sixth preview image 516. Therefore, an additional touch may be applied to each region of the touch screen 151 that has been separated into a plurality of regions to perform a new function. Referring to Fig.
8B, when the specific type of touch (a pinch-in touch input) is applied to the first capture control region (A1) in a state in which the touch screen 151 is separated into the first and second capture control regions (A1, A2), the controller 180 controls the touch screen 151 to reduce the first capture control region (A1) and to enlarge the second capture control region (A2). Referring to FIG. 8C, the controller 180 can change the size of the first and second capture control regions (A1, A2) according to a touch applied to the boundary between the first and second capture control regions (A1, A2). Figs. 9A-9C are conceptual views for explaining a control method for controlling a front view capturing apparatus. Referring to Fig. 9A, when the specific type of touch (a pinch-in touch input) is applied in a state in which a seventh preview image 517 is displayed, the controller 180 activates the front view capturing apparatus. The seventh preview image 517 corresponds to an image acquired at the minimum capture magnification of the image capture apparatus 121. The touch screen 151 is separated into the first and second capture control regions (A1, A2); the seventh preview image 517 is displayed in the first capture control region (A1), and an eighth preview image 518 acquired from the front view capturing apparatus is displayed in the second capture control region (A2). In other words, the controller 180 may activate a further image capture apparatus according to the specific type of touch, and may receive images acquired by different image capture apparatuses at the same time. Referring to Fig. 9B, when the specific type of touch is applied in a state in which the seventh preview image 517 is displayed, the controller 180 separates the touch screen into the first and second capture control regions (A1, A2).
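The boundary-drag resizing of FIG. 8C amounts to moving the split position between the first and second capture control regions while keeping both regions at a usable minimum size. The sketch below is an assumed coordinate model (a fractional boundary position over a pixel screen width), introduced only for illustration.

```python
# Hypothetical sketch of resizing regions A1/A2 by dragging the boundary
# between them. The boundary is the fractional x-position of the split;
# both regions are kept at least min_size of the screen wide (assumption).
def drag_boundary(boundary, dx, screen_width, min_size=0.1):
    new = boundary + dx / screen_width
    return min(1.0 - min_size, max(min_size, new))

shrunk = drag_boundary(0.5, -100, 1000)   # drag left: A1 shrinks, A2 grows
clamped = drag_boundary(0.5, 1000, 1000)  # large drag clamped so A2 survives
```

Clamping the boundary ensures neither capture control region can be dragged entirely off-screen, which keeps both previews reachable for touch commands.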
The controller 180 activates the front view capturing apparatus according to a touch applied to the second capture control region (A2), and controls the display unit to display the eighth preview image 518 acquired by the front view capturing apparatus in the second capture control region (A2). Here, the touch can correspond to a drag-type touch input applied in one direction. Although not shown in the drawing, when a drag-type touch input is applied to the second capture control region (A2) in a different direction, the controller 180 controls the touch screen 151 to display the seventh preview image 517 again. The controller 180 switches a capture mode of the image capture apparatus 121 according to the touch applied to the second capture control region (A2). Here, the touch corresponds to a drag-type touch applied in one direction. For example, the controller 180 may control the image capture apparatus 121 to change a viewing angle of the image capture apparatus 121 depending on the drag touch. In other words, the controller 180 can form a control command to activate a wide-angle capture mode. The second capture control region (A2) displays a preview image 518' captured in the wide-angle capture mode. In the present embodiment, the user may apply a touch to a divided capture control region to control the activation of an image capture apparatus or to change the capture mode of the image capture apparatus. Referring to Fig. 9C, the touch screen 151 is separated into the first and second capture control regions (A1, A2) to display the seventh preview image 517 and the eighth preview image 518, respectively, in each capture control region. The controller 180 separates the touch screen 151 into the third to sixth capture control regions (B1, B2, B3, B4) according to a specific type of touch (a pinch-in touch input) applied on the touch screen 151.
Two touch points of the pinch-in input correspond to the first and second capture control regions (A1, A2), respectively. The touch screen 151 displays a thumbnail image 517a of the seventh preview image 517 in the third and fourth capture control regions (B1, B2). In addition, the touch screen 151 displays a thumbnail image 518a of the eighth preview image 518 in the fifth and sixth capture control regions (B3, B4). The controller 180 captures images displayed in the fourth and sixth capture control regions (B2, B4) or forms a video file according to a touch applied to the fourth and sixth capture control regions (B2, B4). The controller 180 controls the touch screen 151 to display the capture bar 602 indicating a capture time of the image while the video file is generated. In other words, according to the present embodiment, it may be possible to form a video file or to store images while checking preview images acquired by a plurality of image capture apparatuses. Figs. 10A to 10C are conceptual views for explaining a method of controlling the capture of an image. Referring to Fig. 10A, when the specific type of touch (a pinch-in touch input) is applied during a video capture, the controller 180 controls the memory 170 to store a preview image at the time when the specific type of touch is applied. In other words, the controller 180 captures an image displayed on the touch screen according to the specific type of touch. The touch screen displays the captured image on the gallery icon 613. Referring to Fig. 10B, the touch screen 151 displays the playback screen of a video file. When the specific type of touch (a pinch-in touch input) is applied to the playback screen, the controller 180 controls the memory 170 to store the screen information at the time when the specific type of touch is applied. The touch screen 151 displays an icon 513' corresponding to the captured screen. The controller 180 may display the captured image according to a touch applied to the icon 513'.
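Capturing a still at the instant a pinch-in is applied, during recording or playback, reduces to picking the frame shown at that timestamp and storing it separately without interrupting the stream. The sketch below models frames as a timestamp-keyed dictionary, which is an assumption made purely for illustration.

```python
# Hedged sketch of storing an image at the moment a pinch-in is applied
# during video playback: the frame displayed at time t is stored in the
# gallery without stopping playback. The frame model is an assumption.
def capture_during_playback(frames, t, gallery):
    # Pick the latest frame at or before the touch time t.
    key = max(k for k in frames if k <= t)
    gallery.append(frames[key])  # stored separately, e.g. on gallery icon 613
    return frames[key]

frames = {0.0: "f0", 1.0: "f1", 2.0: "f2"}
gallery = []
captured = capture_during_playback(frames, 1.5, gallery)  # pinch at t = 1.5
```

Since only a lookup and a store occur, playback (or the ongoing recording) continues unaffected, matching the "without stopping video playback" behaviour the text describes.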
Referring to Fig. 10C, the touch screen 151 displays the playback screen 620 of a video file, and displays a playback bar 519a indicating the playing time of the video file. The controller 180 displays a storage section portion 519b on the playback bar 519a based on a touch applied to the playback screen 620. When a touch is applied to the storage section portion 519b, the controller 180 controls the touch screen 151 to display a check window 805 for verifying whether or not the section should be separately stored. In the present embodiments, a pinch-in touch input may be applied while capturing a video or playing a video file to separately store an image at the relevant time. As a result, the user can immediately store an image separately without stopping video playback to capture the image. Figs. 11A-11C are conceptual views for explaining a method of controlling the display of a previously stored image. Referring to Fig. 11A, when the specific type of touch is applied in a state in which the seventh preview image 517 is displayed, the controller 180 controls the touch screen to display a storage image 641 previously stored in the memory. The storage image 641 may correspond to an image captured by the image capture apparatus 121 or an image stored from a server. The controller 180 displays screen information 642 including a plurality of storage images according to the specific touch input applied to the storage image 641. The screen information 642 may include a plurality of sequentially arranged storage images 642a and a control icon 642b for controlling (e.g., sharing, deleting, copying, revising, etc.) the storage images 642a. In the present embodiment, when an image capture apparatus is activated to display a preview image, a specific touch input may be applied to provide stored image files, and the type of image files provided can be set by the user. Referring to Fig. 11B, the touch screen displays a ninth preview image 519 acquired by the image capture apparatus 121.
The controller 180 analyzes an object included in the ninth preview image 519 based on a specific type of touch (close-in pinch input) applied to the ninth preview image 519. The controller 180 collects information associated with the object. For example, the controller 180 may search for information associated with the object on a specific server, or retrieve information associated with the object from the memory 170. The touch screen 151 is separated into the first and second capture control regions (A1, A2) according to the specific touch; the first capture control region (A1) may display the ninth preview image 519, and the second capture control region (A2) may display screen information 631 including the information associated with the object. Alternatively, the touch screen 151 may display a graphical image 632 associated with the information on the ninth preview image 519. In this case, the controller 180 may store the graphical image 632 and the ninth preview image 519 at the same time according to a capture command applied by the user. According to the present embodiment, the information of an object included in a preview image can be checked according to a specific touch input before storing the preview image. In addition, the preview image can be stored together with the information. Referring to Fig. 11C, the controller 180 may analyze a shape included in the tenth preview image 520 according to the specific type of touch (close-in pinch input) applied to the tenth preview image 520. The controller 180 can select an object between the two touch positions of a close-in pinch input applied to the tenth preview image 520. The controller 180 separates the touch screen 151 into the first and second capture control regions (A1, A2) according to the specific type of touch. 
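The memory-first, server-fallback lookup described above for Fig. 11B can be sketched as below. The function and argument names are hypothetical, and the server is stood in for by a plain callable; the patent only states that information may come from the memory 170 or from a specific server.

```python
def lookup_object_info(object_name, local_memory, fetch_from_server):
    """Memory-first lookup of information associated with a recognized
    object, falling back to a server query. Returns the information and
    the source it came from. Illustrative sketch only."""
    info = local_memory.get(object_name)
    if info is not None:
        return info, "memory"
    return fetch_from_server(object_name), "server"


memory = {"eiffel tower": "Landmark in Paris"}
fake_server = lambda name: f"server result for {name!r}"
print(lookup_object_info("eiffel tower", memory, fake_server))
print(lookup_object_info("statue", memory, fake_server))
```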
The first capture control region (A1) displays the tenth preview image 520, and the second capture control region (A2) displays screen information 640 composed of at least one image associated with the selected object. The controller 180 can extract at least one image including the selected object from the memory 170. Otherwise, the controller 180 can receive at least one image associated with the object from a specific server. For example, the controller 180 may recognize a shape included in the tenth preview image 520, and extract an image including the same shape. However, the selected object may not necessarily be limited to a shape. According to the present embodiment, the user can receive images associated with information included in a preview image before storing the preview image. Figs. 12A-12C are conceptual views for explaining a function associated with a view capturing apparatus performed by a specific control command. Referring to Fig. 12A, the touch screen displays the tenth preview image 520 acquired at the minimum capture magnification of the image capture apparatus 121. The controller 180 controls the touch screen 151 to display a plurality of preview images 521, 523, 524, 525 focused on different regions based on a specific type of touch (close-in pinch input) applied to the tenth preview image 520. When the specific type of touch is applied, the controller 180 obtains the number of objects included in the tenth preview image 520, and separates the touch screen 151 into regions corresponding to that number. Referring to Fig. 12A, the controller 180 obtains four shapes included in the tenth preview image 520, and separates the touch screen 151 into the third to sixth capture control regions (B1, B2, B3, B4). The controller 180 displays first to fourth focus images 521, 523, 524, 525, focused on the four shapes (objects), respectively, in the third to sixth capture control regions (B1, B2, B3, B4), respectively. 
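Separating the touch screen into as many capture control regions as there are detected objects (Fig. 12A) implies some layout policy; a near-square grid is one plausible choice. The sketch below is an assumption made for illustration, not the patent's method, and `split_regions` is an invented name.

```python
import math

def split_regions(width, height, n):
    """Divide a width x height screen into n capture control regions
    arranged in a near-square grid, one per detected object (e.g. n=4
    yields four regions such as B1-B4). Each region is returned as an
    (x, y, w, h) rectangle. Illustrative layout policy only."""
    cols = math.ceil(math.sqrt(n))
    rows = math.ceil(n / cols)
    cell_w, cell_h = width // cols, height // rows
    regions = []
    for i in range(n):
        r, c = divmod(i, cols)
        regions.append((c * cell_w, r * cell_h, cell_w, cell_h))
    return regions


# Four shapes detected in the preview -> a 2x2 grid of regions.
print(split_regions(1080, 1920, 4))
```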
Although not shown in the drawing, the controller 180 can store a focus image focused on each region according to a touch applied to the third to sixth capture control regions (B1, B2, B3, B4). In the present embodiment, the user may first receive a plurality of preview images in which focus is formed on various regions included in a preview image, to selectively store them. Referring to Fig. 12B, the controller 180 displays first to fourth zoom images 517c, 517d, 517e, 517f captured at different zoom magnifications in the third to sixth capture control regions (B1, B2, B3, B4) according to a specific type of touch (close-in pinch input) applied to the seventh preview image 517 captured at the minimum capture magnification of the image capture apparatus 121. The controller 180 can control the touch screen 151 to display an indicator indicating that the first to fourth zoom images 517c, 517d, 517e, 517f are captured. On the other hand, the controller 180 can selectively store only the zoom image to which a touch is applied, among the first to fourth zoom images 517c, 517d, 517e, 517f. According to the present embodiment, the user can check a plurality of images at different zoom magnifications simultaneously, to store them selectively. Referring to Fig. 12C, a revision image 605 is displayed together with the seventh preview image 517 according to a specific type of touch applied to the seventh preview image 517. The revision image 605 may be formed along an edge region of the touch screen 151, and may include a plurality of icons. The controller 180 may modify the seventh preview image 517 based on a touch applied consecutively from an icon of the revision image 605 onto the seventh preview image 517. The touch screen 151 displays a modification image 517' of the seventh preview image 517 according to the touch. 
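The zoom images of Fig. 12B, obtained at different magnifications from a preview at minimum magnification, can be approximated by centered crops of the full frame: a crop of 1/m of each dimension, upscaled back to full size, corresponds to an m-times digital zoom. The following sketch computes such crop rectangles; it is an illustration under that assumption, not the patent's implementation.

```python
def zoom_crop(frame_w, frame_h, magnification):
    """Centered crop rectangle (x, y, w, h) approximating a digital
    zoom at the given magnification; upscaling the crop back to full
    size yields the zoom image. Sketch only."""
    if magnification < 1:
        raise ValueError("magnification must be >= 1")
    w = int(frame_w / magnification)
    h = int(frame_h / magnification)
    x = (frame_w - w) // 2
    y = (frame_h - h) // 2
    return (x, y, w, h)


# Four zoom images at different magnifications from one 4000x3000 frame,
# analogous to the first to fourth zoom images 517c-517f.
for m in (1, 2, 4, 8):
    print(m, zoom_crop(4000, 3000, m))
```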
The controller 180 controls the touch screen 151 to allow the revision image 605 to disappear according to an opposite touch (e.g., a pinch-out touch input) applied to the modification image 517'. In addition, the controller 180 can control the memory 170 to store the modification image 517' according to the opposite touch. On the other hand, the controller 180 stores the modification image 517', and then controls the touch screen 151 to redisplay a preview image currently acquired by the image capture apparatus 121. According to the present embodiment, the user can immediately modify and store a preview image acquired in real time by the image capture apparatus. The present invention may be implemented as computer-readable codes on a program-recorded medium. The computer-readable media may include all types of recording devices in which data readable by a computer system are stored. Examples of the computer-readable media include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like, and also include a device implemented in the form of a carrier wave (for example, transmission via the Internet). In addition, the computer may include the controller 180 of the mobile terminal. Therefore, the detailed description of the present invention should not be construed as restrictive in all respects but should be considered illustrative. The scope of the invention is to be determined by reasonable interpretation of the appended claims, and all changes that come within the equivalent scope of the invention are included within the scope of the invention.
Claims (15)

1. A mobile terminal (100), comprising: a view capturing apparatus (121); a touch screen configured to display an image received from the view capturing apparatus (121); and a controller (180) configured to change a capture magnification of the view capturing apparatus (121) according to a specific type of touch applied to the touch screen, wherein the controller (180) activates a specific function associated with the view capturing apparatus (121) according to the touch applied to the touch screen in a state in which an image acquired at a preset magnification is displayed.

2. The mobile terminal (100) of claim 1, wherein the preset magnification corresponds to the minimum capture magnification of the view capturing apparatus (121), and the magnification of the view capturing apparatus (121) decreases according to the specific type of touch.

3. The mobile terminal (100) of claim 2, wherein, when the specific type of touch is applied, the controller (180) divides the touch screen into a plurality of capture control regions (A1, A2), and the controller (180) independently controls screen information displayed in the plurality of capture control regions (A1, A2) according to a touch applied to the plurality of capture control regions (A1, A2).

4. The mobile terminal (100) according to claim 3, wherein the touch screen displays the image in each of the plurality of capture control regions (A1, A2), the controller (180) stores the image according to a first touch applied to one of the plurality of capture control regions (A1, A2), and the touch screen displays another image acquired in real time by the view capturing apparatus (121) in the other capture control region while displaying the image in the one capture control region. 
5. The mobile terminal (100) according to claim 4, wherein the controller (180) controls the view capturing apparatus (121) to capture a video composed of images acquired by the view capturing apparatus (121) over the lapse of time according to a second touch applied to one of the plurality of capture control regions (A1, A2), and displays a capture bar (602) indicating a capture time of the video on the capture control region (A1, A2).

6. The mobile terminal (100) according to claim 3, wherein the controller (180) controls the touch screen to apply and display a different visual effect on each image displayed in the plurality of capture control regions (A1, A2).

7. The mobile terminal (100) according to the claim, wherein the controller (180) displays a view selection window (603) for selecting the shape of the plurality of capture control regions (A1, A2) according to the specific type of touch, and the view selection window (603) comprises a plurality of icons differing in the number of separate capture control regions (A1, A2) and in the size of each capture control region (A1, A2).

8. The mobile terminal (100) according to any one of claims 1 to 7, further comprising: a wireless communication unit (110) configured to perform wireless communication with an external device (100') having a view capturing apparatus and capturing images according to the specific type of touch, wherein the image and an image acquired by the view capturing apparatus of the external device (100') are displayed in a plurality of capture control regions (A1, A2), respectively, of the touch screen separated by the specific type of touch.

9. The mobile terminal (100) according to the claim, wherein the controller (180) forms a panorama image (516) using a first image captured by the view capturing apparatus (121) and a second image captured by the view capturing apparatus of the external device (100').

10. The mobile terminal (100) according to any one of claims 1 to 9, further comprising: an additional view capturing apparatus activated according to the specific type of touch, wherein the controller (180) displays first and second images acquired by the view capturing apparatus (121) and the additional view capturing apparatus, respectively, in first and second capture control regions (A1, A2) of the touch screen separated according to the specific type of touch.

11. The mobile terminal (100) of claim 1, wherein the controller (180) stores an image displayed on the touch screen when the specific type of touch is applied while forming a video file with a plurality of images acquired through the view capturing apparatus (121).

12. The mobile terminal (100) according to any one of claims 1 to 11, further comprising: a memory (170) configured to store an image captured by the view capturing apparatus (121), wherein the controller (180) controls the touch screen to display a storage image previously stored in the memory (170) according to the specific type of touch.

13. The mobile terminal (100) according to claim 12, wherein the controller (180) extracts information on an object contained in the image according to the specific type of touch, and the storage image is associated with the information on the object contained in the image.

14. The mobile terminal (100) according to claim 12, wherein the controller (180) extracts information on an object contained in the image according to the specific type of touch, and receives screen information associated with the information on the object from a predetermined server, and the screen information and the image are displayed in the first and second capture control regions (A1, A2) of the touch screen. 
15. A method of controlling a mobile terminal (100), the control method comprising: displaying an image received from a view capturing apparatus (121) on a touch screen; changing a capture magnification of the view capturing apparatus (121) according to a specific type of touch applied to the touch screen; and activating a specific function associated with the view capturing apparatus (121) according to the touch applied to the touch screen in a state in which an image acquired at a preset magnification is displayed.
Family patents: EP3125092A2 (2017-02-01); KR20170014356A (2017-02-08); CN106412415A (2017-02-15); CN106412415B (2020-11-06); US10321045B2 (2019-06-11); US20170034428A1 (2017-02-02); EP3125092A3 (2017-04-19).
Legal status: search report ready on 2017-05-26; renewal fees paid for years 2 to 6 (2017-2021).
Priority: application KR1020150107536A, filed 2015-07-29, "Mobile terminal and method of controlling the same" (published as KR20170014356A).